How do you calculate accuracy?

Accuracy is typically stated using two terms: ppm of reading and ppm of range.
ppm stands for parts per million. ppm is used instead of percent when the magnitude of the numbers is small (10,000 ppm = 1%).

These two terms correspond to the gain and zero (offset) errors: ppm of reading is the gain term, and ppm of range is the zero term.
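Put together, the total allowed error for a single measurement is: error = (reading x ppm of reading + range x ppm of range) / 1,000,000.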

Here is an example of how to calculate accuracy when measuring a 0.5 V signal on the 1 V range, using the 90-day specification.

90-day spec = 20 ppm of reading + 20 ppm of range


Our reading will nominally be 0.5 volts. What is 1 ppm of 0.5 V? Just divide 0.5 V by 1 million… the answer is 0.5 uV. Now multiply that by the ppm-of-reading figure, which is 20: 0.5 uV x 20 = 10 uV.


What is 1 ppm of the range (1 V in this case)? Just divide the 1 V range by 1 million… the answer is 1 uV. Now multiply that by the ppm-of-range figure, which is 20, and the answer is 20 uV.


Now add the two terms together.

20 uV + 10 uV = 30 uV

That is 30 uV out of the 1 V range, which is 0.000030/1 = 0.003%, or 30 ppm.


The reported measurement will be within specification as long as it falls between 0.499970 V and 0.500030 V.
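
For anyone who wants to script this, here is a minimal Python sketch of the same arithmetic. The function name accuracy_limits is just illustrative; the spec values are the 90-day numbers from the example above.

def accuracy_limits(reading, rng, ppm_of_reading, ppm_of_range):
    # Gain term: ppm of reading scales with the measured value
    reading_term = reading * ppm_of_reading / 1e6
    # Zero term: ppm of range scales with the full-scale range
    range_term = rng * ppm_of_range / 1e6
    error = reading_term + range_term
    return error, reading - error, reading + error

# 0.5 V signal on the 1 V range, 90-day spec of 20 ppm + 20 ppm
error, low, high = accuracy_limits(0.5, 1.0, 20, 20)
print(f"error  = {error * 1e6:.0f} uV")          # error  = 30 uV
print(f"limits = {low:.6f} V to {high:.6f} V")   # limits = 0.499970 V to 0.500030 V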